Some equivalences between Shannon entropy and Kolmogorov complexity

Authors

  • Sik K. Leung-Yan-Cheong
  • Thomas M. Cover


Related articles

Inequalities for Shannon Entropy and Kolmogorov Complexity

Kolmogorov (1968, IEEE Trans. Inform. Theory 14, 662–664) observed that the properties of algorithmic complexity and Shannon entropy are similar. We investigate one aspect of this similarity. Namely, we are interested in linear inequalities that are valid for Shannon entropy and for Kolmogorov complexity. It turns out that (1) all linear inequalities that are valid for Kolmogorov com...
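For a concrete instance of such an inequality (an illustration using standard definitions, not a claim taken from the truncated abstract above): subadditivity holds exactly for Shannon entropy and up to a logarithmic term for prefix complexity,

\[ H(X) + H(Y) \;\ge\; H(X,Y), \qquad K(x) + K(y) \;\ge\; K(x,y) - O(\log(K(x) + K(y))). \]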


Shannon Entropy vs. Kolmogorov Complexity

Most assertions involving Shannon entropy have their Kolmogorov complexity counterparts. A general theorem of Romashchenko [4] states that every information inequality that is valid in Shannon's theory is also valid in Kolmogorov's theory, and vice versa. In this paper we prove that this is no longer true for ∀∃-assertions, exhibiting the first example where the formal analogy between Shannon e...
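For reference, the theorem for linear inequalities can be sketched as follows (a hedged paraphrase; here \(I\) ranges over subsets of the variable indices and \(N\) denotes the total complexity of the strings involved):

\[ \sum_I \lambda_I\, H(X_I) \ge 0 \ \text{for all distributions} \quad\Longleftrightarrow\quad \sum_I \lambda_I\, K(x_I) \ge -O(\log N) \ \text{for all strings}. \]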


Entropy Measures vs. Kolmogorov Complexity

Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it only holds for α = 1. Regarding a time-bounded analogue relationship, we s...
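The α = 1 relationship is the classical bound (standard for a computable distribution P with Shannon entropy H(P); the constant depends on the choice of universal machine):

\[ 0 \;\le\; \sum_x P(x)\,K(x) - H(P) \;\le\; K(P) + O(1), \]

while the Rényi entropy of order α,

\[ H_\alpha(P) = \frac{1}{1-\alpha} \log \sum_x P(x)^\alpha, \]

recovers H(P) only in the limit α → 1.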


Information Distances versus Entropy Metric

Information distance has become an important tool in a wide variety of applications. Various types of information distance have been introduced over the years. These information distance measures are different from the entropy metric, as the former are based on Kolmogorov complexity and the latter on Shannon entropy. However, for any computable probability distribution, up to a constant, the expected val...
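For comparison, the two notions can be written side by side (standard definitions; the max form of information distance is one common choice among the variants mentioned above):

\[ E(x,y) = \max\{K(x \mid y),\, K(y \mid x)\}, \qquad d(X,Y) = H(X \mid Y) + H(Y \mid X). \]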


Shannon Information and Kolmogorov Complexity

We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy versus Kolmogorov complexity, the relation of both to universal coding, Shannon mutual information versus Kolmogorov (‘algorithmic’) mutual inform...
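The mutual-information parallel mentioned here can be sketched as follows (the algorithmic identities hold up to additive logarithmic terms, by symmetry of information):

\[ I(X;Y) = H(X) - H(X \mid Y), \qquad I(x:y) = K(y) - K(y \mid x) \;\approx\; K(x) + K(y) - K(x,y). \]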




Journal:
  • IEEE Trans. Information Theory

Volume: 24
Issue: –

Pages: –

Publication year: 1978